Sparse algorithm for robust LSSVM in primal space

Authors

  • Li Chen
  • Shuisheng Zhou
Abstract

Owing to its closed-form solution, the least squares support vector machine (LSSVM) has been widely used for classification and regression problems, with performance comparable to other types of SVMs. However, LSSVM has two drawbacks: it is sensitive to outliers and its solution lacks sparseness. Robust LSSVM (R-LSSVM) partially overcomes the first drawback via a nonconvex truncated loss function, but existing algorithms for R-LSSVM produce dense solutions, so they still suffer from the second drawback and are inefficient for training large-scale problems. In this paper, we interpret the robustness of R-LSSVM from a re-weighted viewpoint and derive a primal R-LSSVM via the representer theorem. The new model can have a sparse solution if the corresponding kernel matrix has low rank. By approximating the kernel matrix with a low-rank matrix and smoothing the loss function with an entropy penalty function, we then propose a convergent sparse R-LSSVM (SR-LSSVM) algorithm that achieves a sparse solution of the primal R-LSSVM and thus overcomes both drawbacks of LSSVM simultaneously. The proposed algorithm has lower complexity than existing algorithms and is very efficient for training large-scale problems. Extensive experimental results show that SR-LSSVM achieves better or comparable performance with less training time than related algorithms, especially on large-scale problems.
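The abstract combines two ingredients: a low-rank approximation of the kernel matrix (for sparseness) and a re-weighted view of the truncated loss (for robustness). The following is a minimal sketch of that combination, not the authors' implementation: it uses a Nyström approximation on a random set of prototype vectors and an iteratively re-weighted least-squares loop whose smooth 0/1 weights stand in for the paper's entropy-penalty smoothing; all names and parameter choices are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def sr_lssvm_sketch(X, y, m=30, gamma=1.0, C=100.0, trunc=1.0, p=0.1,
                    n_iter=10, seed=0):
    """Sketch of a sparse robust LSSVM trained in the primal.

    - A Nystrom approximation on m prototype vectors gives low-rank
      features, so the model depends on only m kernel columns (sparseness).
    - Re-weighted least squares with weights that vanish for large
      residuals approximates a truncated squared loss (robustness).
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)       # prototype vectors
    Kmm = rbf_kernel(X[idx], X[idx], gamma)
    U, s, _ = np.linalg.svd(Kmm)
    Kmm_isqrt = U @ np.diag(1.0 / np.sqrt(np.maximum(s, 1e-12))) @ U.T

    def features(Z):
        # Nystrom feature map: Phi = K(Z, prototypes) @ Kmm^{-1/2}
        return rbf_kernel(Z, X[idx], gamma) @ Kmm_isqrt

    Phi = features(X)
    A = np.hstack([Phi, np.ones((len(X), 1))])            # bias column
    v = np.ones(len(X))                                   # sample weights
    for _ in range(n_iter):
        # weighted regularized least squares in the primal
        # (the bias is regularized too, a simplification of the LSSVM primal)
        H = A.T @ (v[:, None] * A) + np.eye(A.shape[1]) / C
        wb = np.linalg.solve(H, A.T @ (v * y))
        r = y - A @ wb
        # smooth 0/1 weights: residuals past the truncation level get ~0
        z = np.clip((r ** 2 - trunc) / p, -50.0, 50.0)
        v = 1.0 / (1.0 + np.exp(z))
    return lambda Z: features(Z) @ wb[:-1] + wb[-1]
```

On a toy regression task with a few gross outliers, the re-weighting drives the outliers' influence toward zero within a couple of iterations, while the Nyström features keep each solve at O(nm²) rather than the O(n³) of a dense kernel system.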


Related articles

A Study of Wind Speed Prediction Based on Particle Swarm Algorithm to Optimize the Parameters of Sparse Least Squares Support Vector

Accurate wind speed forecasting can improve the prediction efficiency of wind power in a wind farm, decrease the failure probability of wind turbines, and extend their life cycle. An innovative algorithm is proposed to optimize both the parameters of the least squares support vector machine (LSSVM) and the procedure of finding sparse support vectors. First, the defects of the support vectors are analyzed. Then inequa...


Least Squares Support Vector Machines and Primal Space Estimation

In this paper a methodology for estimation in kernel-induced feature spaces is presented, making a link between the primal-dual formulation of Least Squares Support Vector Machines (LS-SVM) and classical statistical inference techniques in order to perform linear regression in primal space. This is done by computing a finite dimensional approximation of the kernel-induced feature space mapping ...


The Application of Least Square Support Vector Machine as a Mathematical Algorithm for Diagnosing Drilling Effectivity in Shaly Formations

The problem of slow drilling in deep shale formations occurs worldwide, causing significant expense to the oil industry. Bit balling, widely considered the main cause of poor bit performance in shales, especially deep shales drilled with water-based mud, motivates efforts to develop a model to diagnose drilling effectivity. Hence, we arrived at graphical cor...


Active Learning for Sparse Least Squares Support Vector Machines

The least squares support vector machine (LSSVM) lacks sparseness, and the standard sparse algorithm has the problem that it must label all of the training data. We propose an active learning algorithm based on LSSVM to address the sparseness problem. This method first constructs a minimal classification LSSVM, then calculates the uncertainty of each sample, selecting the one closest to the category boundary to mark the sam...


Sparse Reductions for Fixed-Size Least Squares Support Vector Machines on Large Scale Data

Fixed-Size Least Squares Support Vector Machines (FS-LSSVM) is a powerful tool for solving large-scale classification and regression problems. FS-LSSVM solves an over-determined system of M linear equations by using Nyström approximations on a set of prototype vectors (PVs) in the primal. This introduces sparsity in the model along with the ability to scale to large datasets. But there exists no f...



Journal:
  • Neurocomputing

Volume 275, Issue –

Pages –

Published: 2018